Potential Energy Advantage of Quantum Economy
Liu, Junyu, Jiang, Hansheng, Shen, Zuo-Jun Max
Energy cost is increasingly crucial in the modern computing industry with the wide deployment of large-scale machine learning models and language models. For firms that provide computing services, low energy consumption matters both for their own market growth and for compliance with government regulations. In this paper, we study the energy benefits of quantum computing vis-à-vis classical computing. Deviating from the conventional notion of quantum advantage based solely on computational complexity, we redefine advantage in an energy efficiency context. Through a Cournot competition model constrained by energy usage, we demonstrate that quantum computing firms can outperform classical counterparts in both profitability and energy efficiency at Nash equilibrium. Quantum computing may therefore represent a more sustainable pathway for the computing industry. Moreover, we find that the energy benefits of quantum computing economies are contingent on large-scale computation. Based on real physical parameters, we further illustrate the scale of operation necessary for realizing this energy efficiency advantage.
- North America > Canada > Ontario > Toronto (0.14)
- North America > United States > Illinois > Cook County > Chicago (0.05)
- Asia > China > Hong Kong (0.04)
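The abstract above rests on a Cournot competition argument. A minimal sketch of the textbook linear-demand Cournot duopoly (illustrative parameters only, not the paper's energy-constrained model) shows the mechanism it invokes: the firm with the lower marginal cost produces more and earns higher profit at Nash equilibrium.

```python
# Textbook Cournot duopoly with linear inverse demand P(Q) = a - b*Q.
# Closed-form Nash equilibrium quantities follow from each firm's
# best-response function. Parameter values below are illustrative
# assumptions, not figures from the paper.

def cournot_equilibrium(a, b, c1, c2):
    """Nash equilibrium quantities and profits for two firms with
    constant marginal costs c1, c2 under demand P = a - b*(q1 + q2)."""
    q1 = (a - 2 * c1 + c2) / (3 * b)
    q2 = (a - 2 * c2 + c1) / (3 * b)
    price = a - b * (q1 + q2)
    return q1, q2, (price - c1) * q1, (price - c2) * q2

# Hypothetical scenario: at large scale the quantum firm's marginal
# (energy) cost is lower than the classical firm's.
q_qc, q_cl, profit_qc, profit_cl = cournot_equilibrium(
    a=100.0, b=1.0, c1=10.0, c2=20.0)  # c1: quantum, c2: classical
print(profit_qc > profit_cl)  # prints True: lower cost wins at equilibrium
```

The paper's actual model adds an energy-usage constraint on top of this structure; the sketch only illustrates the baseline cost-advantage mechanism.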
The Chatbots Are Here, and the Internet Industry Is in a Tizzy
The volatility of chatbots has made it impossible to predict their impact. The result is an industry gripped with the question: What do we do now? When Aaron Levie, the chief executive of Box, tried a new A.I. chatbot called ChatGPT in early December, it didn't take him long to declare, "We need people on this!" He cleared his calendar and asked employees to figure out how the technology, which instantly provides comprehensive answers to complex questions, could benefit Box, a cloud computing company that sells services that help businesses manage their online data. Mr. Levie's reaction to ChatGPT was typical of the anxiety -- and excitement -- over Silicon Valley's new new thing.
Nvidia To Scrap $40bn Takeover Of Chip Firm Arm: Report
US firm Nvidia is scrapping its $40 billion bid to buy UK mobile chip technology powerhouse Arm from SoftBank after persistent objections from regulators, the Financial Times reported Tuesday. Nvidia and SoftBank Group both declined to comment on the report, which cited three unnamed sources with direct knowledge of the deal. But the collapse would be no surprise, after recent speculation that the deal was on the verge of failure following pressure from US, UK and EU regulators concerned it would undermine competition. In December, US regulators filed a lawsuit seeking to block the merger, while British and European regulators had ordered probes into the deal. Japan's SoftBank Group announced in 2020 that it was selling Arm for up to $40 billion in a deal it hoped to complete in early 2022, subject to regulatory approvals. The value of the cash-and-shares deal has since risen as stock markets have rallied, with Nvidia's shares soaring.
- North America > United States > New York (0.07)
- North America > United States > California (0.07)
- Asia > Japan > Honshū > Kantō > Tokyo Metropolis Prefecture > Tokyo (0.07)
- Information Technology > Hardware (1.00)
- Law > Litigation (0.96)
US Sues To Block Chipmaker Nvidia's $40 Bn Merger With UK's Arm
US regulators filed a lawsuit Thursday to block the $40-billion merger of graphics chip star Nvidia with mobile chip technology powerhouse Arm Ltd, fearing it would undermine competition. The move comes as US President Joe Biden strives to ramp up domestic chip production to ease American industry's reliance on imports. "The proposed vertical deal would give one of the largest chip companies control over the computing technology and designs that rival firms rely on to develop their own competing chips," the Federal Trade Commission said in a release, calling chips "critical infrastructure." The world faces a global shortage of semiconductors, choking production of a wide range of products including automobiles and sending new and used car prices surging. The FTC echoed concerns expressed about the merger by regulators in the United Kingdom, who recently ordered an in-depth probe of the takeover.
- Europe > United Kingdom (0.53)
- North America > United States > California (0.06)
- Asia > Japan > Honshū > Kantō > Tokyo Metropolis Prefecture > Tokyo (0.06)
- Law (1.00)
- Government > Regional Government > North America Government > United States Government (1.00)
AI everywhere
In 1993, at the age of 30, he co-founded Nvidia and has occupied the top executive spot ever since. What began as a provider of relatively niche graphics processing units (GPUs) with a narrow field of general computing uses has evolved to become, arguably, the bedrock underlying the current AI market explosion. As Nvidia gears up for its eighth annual GPU Technology Conference (GTC), which happens May 8-11 in San Jose, the company has a lot to celebrate. Its stock price hit record highs this year, its tech was everywhere at CES, and in addition to general AI applications, it's found a new collection of deep-pocketed partners among automakers looking to usher in autonomous driving using neural networks powered by GPUs. I spoke to Huang about how the company got to where it is today, as well as what GTC has become in the general landscape and what it means to Nvidia.
Breakthroughs in Artificial Intelligence from 2014
The holy grail of artificial intelligence--creating software that comes close to mimicking human intelligence--remains far off. But 2014 saw major strides in machine learning software that can gain abilities from experience. Companies in sectors from biotech to computing turned to these new techniques to solve tough problems or develop new products. The most striking research results in AI came from the field of deep learning, which involves using crude simulated neurons to process data. Work in deep learning often focuses on images, which are easy for humans to understand but very difficult for software to decipher.
- North America > United States > California (0.05)
- Asia > China (0.05)
- Health & Medicine > Pharmaceuticals & Biotechnology (0.54)
- Health & Medicine > Therapeutic Area > Neurology (0.31)
Artificial intelligence in the cloud promises to be the next great disrupter
Cloud computing is still in its infancy. Most big companies are assessing how far to go in shifting computing tasks into the giant, centralised data centres that form the cloud, and it will be years before most existing computing workloads move to it, if ever. The case for the cloud so far has been based on its potential to lower computing costs and increase business flexibility. By tapping into a large cloud computing company, such as Amazon Web Services (AWS), clients have hoped to move more quickly and cheaply to increase or decrease their computing resources as their needs change. However, the cloud is evolving more quickly than many executives realise.
Moore's Law Is Dead. Now What?
Mobile apps, video games, spreadsheets, and accurate weather forecasts: that's just a sampling of the life-changing things made possible by the reliable, exponential growth in the power of computer chips over the past five decades. But in a few years technology companies may have to work harder to bring us advanced new use cases for computers. The continual cramming of more silicon transistors onto chips, known as Moore's Law, has been the feedstock of exuberant innovation in computing. But it looks to be slowing to a halt. "We have to ask, is this going to be a problem for areas like mobile devices, data centers, and self-driving cars?" says Thomas Wenisch, an assistant professor at the University of Michigan.